Improve validate solutions script & fix pre-commit error #3253


Merged

Conversation

dhruvmanila
Member

@dhruvmanila dhruvmanila commented Oct 13, 2020

Reason:

I wanted to add the ability to log the time each solution takes. I started by adding this functionality to the existing script, but that got quite cumbersome, as you can see in my previous commit on this branch. Then I had an aha moment about changing the way I am testing the solutions, which led me to this PR.

Describe your change:

I have completely changed the way I was performing the tests. Instead of parametrizing the problem numbers and expected outputs, I parametrize the solution file paths. This way we won't need the subtests dependency. It also means we only test the files present in the directory, so there is no need to skip solutions that have not been submitted. We can also now look at the time each solution takes with the help of pytest's --durations flag.

Steps:

  • Collect all the solution file paths
  • Convert the paths into a Python module
  • Call solution on the module
  • Assert the answer with the expected results
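The steps above can be sketched as follows. This is a minimal illustration using a throwaway file rather than the actual script; the `solution()` entry-point name comes from the description above, and the file layout is assumed:

```python
import importlib.util
import pathlib
import tempfile

def convert_path_to_module(file_path: pathlib.Path):
    """Step 2: load a solution file as a standalone Python module."""
    spec = importlib.util.spec_from_file_location(file_path.stem, str(file_path))
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module

# Demonstrate with a throwaway solution file instead of the real
# project_euler/ tree (step 1 would instead glob the solution paths):
with tempfile.TemporaryDirectory() as tmp:
    solution_file = pathlib.Path(tmp) / "sol1.py"
    solution_file.write_text("def solution():\n    return 233168\n")
    module = convert_path_to_module(solution_file)  # step 2
    answer = module.solution()                      # step 3
    assert answer == 233168                         # step 4
    print(answer)
```

In the real test this loading step would sit inside a `@pytest.mark.parametrize` over the collected file paths, so each solution file becomes its own test case.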

To get the expected value, the JSON list object had to be converted into a Python dictionary object, which required changing the JSON file itself.
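For illustration, the conversion might look like this; the field names `problem_number` and `answer` are hypothetical stand-ins, not necessarily the JSON file's actual keys:

```python
import json

# Hypothetical original shape: a JSON list of per-problem objects.
raw = '[{"problem_number": 1, "answer": "233168"}, {"problem_number": 2, "answer": "4613732"}]'
answers_list = json.loads(raw)

# Keying the data by problem number turns the expected-value lookup in
# the test into a single dictionary access.
answers_dict = {str(entry["problem_number"]): entry["answer"] for entry in answers_list}
print(answers_dict["2"])
```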

  • Improve an algorithm

Checklist:

  • I have read CONTRIBUTING.md.
  • This pull request is all my own work -- I have not plagiarized.
  • I know that pull requests will not be merged if they fail the automated tests.
  • This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.

- Use a pytest fixture along with the --capture=no flag to print out the
  top DURATIONS slowest solutions at the end of the test session.
- Remove the print part and the try ... except ... block from the test
  function.
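A session-scoped fixture of the kind described above might look roughly like this; the `DURATIONS` constant, the `timings` dict, and the fixture name are all assumed for illustration, not taken from the actual commit:

```python
import pytest

DURATIONS = 3       # assumed: how many of the slowest solutions to report
timings: dict = {}  # assumed: filled in by the test function, name -> seconds

def slowest(records: dict, top: int) -> list:
    """Return the `top` slowest entries, slowest first."""
    return sorted(records.items(), key=lambda kv: kv[1], reverse=True)[:top]

@pytest.fixture(scope="session", autouse=True)
def report_durations():
    # The code after `yield` runs once the whole session is done; the
    # print only reaches the terminal when pytest runs with --capture=no.
    yield
    for name, seconds in slowest(timings, DURATIONS):
        print(f"{name}: {seconds:.2f}s")

# Demonstrate the reporting logic directly:
timings.update({"sol1.py": 0.02, "sol2.py": 1.50, "sol3.py": 0.31, "sol4.py": 0.07})
print(slowest(timings, DURATIONS))
```

Proposal 3 below made this bookkeeping unnecessary, since pytest's built-in `--durations` reporting covers the same need once each solution is its own test case.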
Completely changed the way I was performing the tests. Instead of
parametrizing the problem numbers and expected output, I will
parametrize the solution file path.

Steps:
- Collect all the solution file paths
- Convert the paths into a Python module
- Call solution on the module
- Assert the answer with the expected results

For assertion, the JSON list object had to be converted into a
Python dictionary object, which required changing the JSON file itself.
@dhruvmanila
Member Author

Wow, thanks for such a quick response!

@dhruvmanila dhruvmanila merged commit 29b32d3 into TheAlgorithms:master Oct 13, 2020
@cclauss
Member

cclauss commented Oct 13, 2020

This is great work!

@dhruvmanila dhruvmanila deleted the improve-validate-solutions branch October 13, 2020 11:59
stokhos pushed a commit to stokhos/Python that referenced this pull request Jan 3, 2021
…ms#3253)

* Trying to time every solution

* Proposal 2 for timing PE solutions:

- Use pytest fixture along with --capture=no flag to print out the
  top DURATIONS slowest solution at the end of the test sessions.
- Remove the print part and try ... except ... block from the test
  function.

* Proposal 3 for timing PE solutions:

Completely changed the way I was performing the tests. Instead of
parametrizing the problem numbers and expected output, I will
parametrize the solution file path.

Steps:
- Collect all the solution file paths
- Convert the paths into a Python module
- Call solution on the module
- Assert the answer with the expected results

For assertion, it was needed to convert the JSON list object to
Python dictionary object which required changing the JSON file itself.

* Add type hints for variables

* Fix whitespace in single_qubit_measure
Panquesito7 pushed a commit to Panquesito7/Python that referenced this pull request May 13, 2021
…ms#3253)